Steepest descent preconditioning for nonlinear GMRES optimization

Authors

Abstract


Similar references

Steepest Descent Preconditioning for Nonlinear GMRES Optimization

Steepest descent preconditioning is considered for the recently proposed nonlinear generalized minimal residual (N-GMRES) optimization algorithm for unconstrained nonlinear optimization. Two steepest descent preconditioning variants are proposed. The first employs a line search, while the second employs a predefined small step. A simple global convergence proof is provided for the N-GMRES optimi...
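
To make the two-phase structure described in this abstract concrete, here is a minimal NumPy sketch of a windowed N-GMRES-style loop with a steepest-descent preconditioner of the second kind (a predefined small step). The window size, step length `beta`, and the backtracking safeguard are illustrative assumptions, not the parameters or the exact algorithm analyzed in the paper.

```python
import numpy as np

def ngmres_sd(f, grad, x0, window=5, beta=1e-2, max_iter=200, tol=1e-8):
    """Sketch of a steepest-descent-preconditioned N-GMRES-style loop.

    The preconditioner is a single steepest-descent step with a predefined
    small step length `beta` (the abstract's second variant); all numeric
    parameters here are illustrative choices.
    """
    x = np.asarray(x0, dtype=float)
    X, G = [x], [grad(x)]                      # windowed history of iterates/gradients
    for _ in range(max_iter):
        g = G[-1]
        if np.linalg.norm(g) < tol:
            break
        # 1) Preconditioning: one predefined small steepest-descent step.
        x_bar = X[-1] - beta * g
        g_bar = grad(x_bar)
        # 2) Acceleration: least-squares combination over the window that
        #    minimizes the linearized gradient (nonlinear residual) norm.
        A = np.column_stack([g_bar - gj for gj in G])
        alpha, *_ = np.linalg.lstsq(A, -g_bar, rcond=None)
        x_hat = x_bar + sum(a * (x_bar - xj) for a, xj in zip(alpha, X))
        # 3) Safeguard: backtrack toward the preconditioned iterate if the
        #    accelerated point does not decrease f.
        lam, x_new = 1.0, x_hat
        while f(x_new) > f(x_bar) and lam > 1e-4:
            lam *= 0.5
            x_new = x_bar + lam * (x_hat - x_bar)
        if f(x_new) > f(x_bar):
            x_new = x_bar
        X.append(x_new)
        G.append(grad(x_new))
        X, G = X[-window:], G[-window:]        # keep only the last `window` iterates
    return X[-1]

if __name__ == "__main__":
    # Tiny smoke test on a convex quadratic (hypothetical example).
    Q = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ Q @ x
    grad = lambda x: Q @ x
    print(ngmres_sd(f, grad, np.ones(3)))      # should approach the minimizer at 0
```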


Nonsymmetric Preconditioning for Conjugate Gradient and Steepest Descent Methods

We numerically analyze the possibility of turning off postsmoothing (relaxation) in geometric multigrid when used as a preconditioner in conjugate gradient linear and eigenvalue solvers for the 3D Laplacian. The geometric Semicoarsening Multigrid (SMG) method is provided by the hypre parallel software package. We solve linear systems using two variants (standard and flexible) of the preconditio...
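
As a point of reference for the "standard versus flexible" distinction mentioned in this abstract, the sketch below shows a generic flexible preconditioned CG iteration in NumPy. The function `precond` stands in for any (possibly changing) SPD preconditioner, such as one multigrid cycle; nothing here uses the hypre SMG solver itself.

```python
import numpy as np

def flexible_pcg(A, b, precond, x0=None, tol=1e-8, max_iter=500):
    """Preconditioned CG with the 'flexible' (Polak-Ribiere-type) beta,
    which tolerates a preconditioner that varies from step to step.
    `precond(r)` applies the current SPD preconditioner to the residual r.
    This is a generic sketch, not the cited paper's experiments.
    """
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = precond(r_new)
        # Flexible beta uses z_new - z; it reduces to the standard formula
        # when the preconditioner is fixed and the recurrences are exact.
        beta = (r_new @ (z_new - z)) / rz
        p = z_new + beta * p
        r, z = r_new, z_new
        rz = r @ z
    return x
```

For a quick (fixed-preconditioner) usage example, `precond = lambda r: r / np.diag(A)` gives Jacobi preconditioning and the iteration behaves like ordinary preconditioned CG.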


SVM-Optimization and Steepest-Descent Line Search

We consider (a subclass of) convex quadratic optimization problems and analyze decomposition algorithms that perform, at least approximately, steepest-descent exact line search. We show that these algorithms, when implemented properly, are within ε of optimality after O(log 1/ε) iterations for strictly convex cost functions, and after O(1/ε) iterations in the general case. Our analysis is gener...
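
On a convex quadratic, the exact line-search step that these decomposition algorithms approximate has a closed form. The sketch below shows plain steepest descent with that exact step; the working-set (decomposition) machinery of the paper and the SVM dual constraints are not reproduced here.

```python
import numpy as np

def steepest_descent_exact(Q, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent with exact line search for min 0.5 x'Qx - b'x, Q SPD.

    For a quadratic objective the exact step along the negative gradient is
    alpha = (g'g) / (g'Qg).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = Q @ x - b                       # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ Q @ g)       # exact minimizer along -g
        x = x - alpha * g
    return x
```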


Steepest Descent and Conjugate Gradient Methods with Variable Preconditioning

We analyze the conjugate gradient (CG) method with variable preconditioning for solving a linear system with a real symmetric positive definite (SPD) matrix of coefficients A. We assume that the preconditioner is SPD on each step, and that the condition number of the preconditioned system matrix is bounded above by a constant independent of the step number. We show that the CG method with varia...
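
For orientation, the following sketch applies preconditioned steepest descent to an SPD linear system with a preconditioner that is allowed to change at every step. It is a generic illustration under the abstract's SPD assumption; the bounded-condition-number hypothesis and the paper's convergence analysis are not encoded here, and `make_precond` is a hypothetical callback name.

```python
import numpy as np

def psd_variable_precond(A, b, make_precond, x0=None, tol=1e-8, max_iter=2000):
    """Preconditioned steepest descent for SPD A x = b with a step-dependent
    SPD preconditioner: `make_precond(k, r)` returns z = B_k^{-1} r at step k.
    """
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float)
    for k in range(max_iter):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            break
        z = make_precond(k, r)              # possibly a different SPD B_k each step
        alpha = (r @ z) / (z @ A @ z)       # exact minimizer of the A-norm error along z
        x = x + alpha * z
    return x
```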


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
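
The abstract's double-parameter scaled quasi-Newton step is not reproduced here. As a loosely related, clearly labeled stand-in, the sketch below uses the classical Barzilai-Borwein step, which likewise derives a step length from secant information and avoids a line search.

```python
import numpy as np

def bb_gradient(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=1000):
    """Line-search-free gradient descent with a Barzilai-Borwein (BB1) step.

    This is a standard stand-in for 'steepest descent without a line search';
    it is not the double-parameter scaled quasi-Newton formula of the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:                      # only update when curvature is positive
            alpha = (s @ s) / sy        # BB1 step length
        x, g = x_new, g_new
    return x
```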



Journal

Journal title: Numerical Linear Algebra with Applications

Year: 2012

ISSN: 1070-5325

DOI: 10.1002/nla.1837